Sorted results: 311 matches found (search time: 15 ms)
1.
Several new protocols, such as RBUDP, User-Level UDP, Tsunami, and SABUL, have been proposed as alternatives to TCP for high-speed data transfer. The purpose of this paper is to analyze the effects of the SABUL congestion control algorithm on SABUL performance metrics such as bandwidth utilization, self-fairness, aggressiveness, and average packet losses. We propose simple deterministic and stochastic models of the SABUL congestion control algorithm and use them to assess these metrics. Our results explain SABUL throughput oscillations, derive bounds on its aggressiveness/responsiveness, show that SABUL can be self-fair, and identify conditions under which SABUL connections may experience excessive packet losses. This work was sponsored in part by National Science Foundation grant No. NSF-9901004, DOE SciDAC grant DE-FC02-01ER25484, and the IBM Corp. Shared University Research Program.
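The throughput oscillations the paper analyzes can be illustrated with a toy deterministic rate-evolution loop. This is a sketch only, not the authors' model: the constants (CAPACITY, INC, DEC_FACTOR) and the loss rule are hypothetical stand-ins for a SABUL-style increase/decrease controller.

```python
CAPACITY = 100.0   # link capacity in Mb/s (hypothetical)
INC = 10.0         # additive rate increase per control interval (hypothetical)
DEC_FACTOR = 0.5   # multiplicative decrease on packet loss (hypothetical)

def simulate(intervals, rate=10.0):
    """Return the sending rate after each control interval."""
    rates = []
    for _ in range(intervals):
        if rate > CAPACITY:          # sending above capacity triggers losses
            rate *= DEC_FACTOR       # back off multiplicatively
        else:
            rate += INC              # probe for more bandwidth
        rates.append(rate)
    return rates

rates = simulate(40)
# The trajectory settles into a sawtooth: the rate climbs past CAPACITY,
# is cut back, and climbs again -- the oscillation the analysis explains.
```

Even this crude model makes the qualitative point: any controller that probes past capacity and backs off on loss oscillates around the capacity rather than converging to it.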
2.
The TSIMMIS Approach to Mediation: Data Models and Languages   (cited 18 times: 2 self-citations, 16 by others)
TSIMMIS—The Stanford-IBM Manager of Multiple Information Sources—is a system for integrating information. It offers a data model and a common query language that are designed to support the combining of information from many different sources. It also offers tools for automatically generating the components that are needed to build systems for integrating information. In this paper we shall discuss the principal architectural features and their rationale.
3.
4.
Generalized Hirsch h-index for disclosing latent facts in citation networks   总被引:1,自引:0,他引:1  
What is the value of a scientist, and what is their impact on scientific thinking? How can we measure the prestige of a journal or a conference? The evaluation of a scientist's work and the estimation of the quality of a journal or conference have long attracted significant interest, owing to the benefits of obtaining an unbiased and fair criterion. Although it appears simple, defining a quality metric is not an easy task. To overcome the disadvantages of existing metrics for ranking scientists and journals, J. E. Hirsch proposed a pioneering metric, the now famous h-index. In this article we demonstrate several inefficiencies of this index and develop a pair of generalizations and effective variants of it for ranking both scientists and publication forums. The new citation indices are able to disclose trendsetters in scientific research, as well as researchers who constantly shape their field with influential work, no matter how old they are. We demonstrate the effectiveness and benefits of the new indices in unfolding the full potential of the h-index, with extensive experimental results obtained from DBLP, a widely known online digital library.
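Hirsch's definition itself is simple to state and compute: the h-index is the largest h such that the author has at least h papers with at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """Return the h-index of a list of per-paper citation counts:
    the largest h with at least h papers cited at least h times each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:      # the rank-th most cited paper has >= rank citations
            h = rank
        else:
            break          # counts are sorted, so no later rank can qualify
    return h

h_index([10, 8, 5, 4, 3])  # -> 4 (four papers with at least 4 citations each)
```

The example also hints at the weakness the article targets: `[25, 8, 5, 3, 3]` and `[5, 5, 5, 3, 3]` both yield h = 3, so the plain h-index is blind to the highly cited outlier.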
5.
When a set of rules generates (conflicting) values for a virtual attribute of some tuple, the system must resolve the inconsistency and decide on a unique value to assign to that attribute. In most current systems, the conflict is resolved by criteria that choose one of the rules in the conflicting set and use the value it generated. There are several applications, however, where inconsistencies of this form arise whose semantics demand a different form of resolution. We propose a general framework for the study of the conflict resolution problem and suggest a variety of resolution criteria, which collectively subsume all previously known solutions. With several new criteria introduced, the semantics of several applications are captured more accurately than in the past. We discuss how conflict resolution criteria can be specified at the schema or the rule-module level. Finally, we suggest some implementation techniques based on rule indexing, which allow conflicts to be resolved efficiently at compile time, so that at run time only a single rule is processed. An earlier version of this work appeared under the title "Conflict Resolution of Rules Assigning Values to Virtual Attributes" in Proceedings of the 1989 ACM-SIGMOD Conference, Portland, OR, June 1989, pp. 205–214. Partially supported by the National Science Foundation under Grant IRI-9157368 (PYI Award) and by grants from DEC, HP, and AT&T. Partially supported by the National Science Foundation under Grant IRI-9057573 (PYI Award), IBM, DEC, and the University of Maryland Institute for Advanced Computer Studies (UMIACS).
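The idea of declaring a resolution criterion rather than always picking one winning rule can be sketched as follows. The criterion names, the value lists, and the salary example are illustrative assumptions, not the paper's actual syntax or criteria catalogue.

```python
# Hypothetical schema-level registry of conflict-resolution criteria: when
# several rules assign conflicting values to a virtual attribute, the
# registered criterion collapses them into one value.
CRITERIA = {
    "max": max,                             # numeric dominance
    "min": min,
    "first": lambda vs: vs[0],              # rule-priority order (list order)
    "average": lambda vs: sum(vs) / len(vs)
}

def resolve(values, criterion):
    """Pick a unique value for a virtual attribute from conflicting rule outputs."""
    if len(set(values)) == 1:               # all rules agree: no real conflict
        return values[0]
    return CRITERIA[criterion](values)

# e.g. three rules assign a salary virtual attribute three different values:
resolve([50_000, 62_000, 58_000], "max")    # -> 62000
```

Criteria such as "average" show why choosing a single rule from the conflicting set is not always enough: the resolved value need not be produced by any one rule.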
6.
Efficient and effective processing of the distance-based join query (DJQ) is of great importance in spatial databases due to the wide range of applications that may pose such queries (mapping, urban planning, transportation planning, resource management, etc.). The most representative and best-studied DJQs are the K Closest Pairs Query (KCPQ) and the ε Distance Join Query (εDJQ). These spatial queries involve two spatial data sets and a distance function to measure the degree of closeness, along with a given number of pairs in the final result (K) or a distance threshold (ε). In this paper, we propose four new plane-sweep-based algorithms for KCPQs and their extensions for εDJQs in the context of spatial databases, without the use of an index for either of the two disk-resident data sets (since building and using indexes is not always favorable for processing performance). They employ a combination of plane-sweep algorithms and space partitioning techniques to join the data sets. Finally, we present the results of an extensive experimental study that compares the efficiency and effectiveness of the proposed algorithms for KCPQs and εDJQs. This performance study, conducted on medium and big spatial data sets (real and synthetic), validates that the proposed plane-sweep-based algorithms are very promising in terms of both efficiency and effectiveness when neither input is indexed. Moreover, the best of the new algorithms is experimentally compared to the best algorithm based on the R-tree (a widely accepted access method) for KCPQs and εDJQs, using the same data sets. This comparison shows that the new algorithms outperform R-tree-based algorithms in most cases.
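The core plane-sweep idea behind index-free KCPQ processing can be illustrated in a few lines: sort both sets along one axis, keep the K best pairs in a max-heap, and prune any candidate whose x-distance alone already exceeds the current K-th best distance. This is a didactic sketch of the classic technique, not one of the paper's four algorithms.

```python
import heapq
import math

def kcpq(P, Q, K):
    """K Closest Pairs between 2-D point sets P and Q via a simple plane sweep.
    Returns [(distance, p, q), ...] sorted by distance."""
    P, Q = sorted(P), sorted(Q)        # sweep order: ascending x
    heap = []                          # max-heap of (-dist^2, p, q), size <= K
    for p in P:
        for q in Q:
            dx = q[0] - p[0]
            # Prune: once K pairs are held, a pair farther apart in x alone
            # than the current K-th distance cannot improve the result.
            if len(heap) == K and dx * dx > -heap[0][0]:
                if dx > 0:
                    break              # Q is x-sorted: remaining q are farther
                continue               # q lies left of p; later q may be closer
            d2 = dx * dx + (q[1] - p[1]) ** 2
            if len(heap) < K:
                heapq.heappush(heap, (-d2, p, q))
            elif d2 < -heap[0][0]:
                heapq.heapreplace(heap, (-d2, p, q))
    return sorted((math.sqrt(-nd2), p, q) for nd2, p, q in heap)
```

The x-distance pruning is what makes the sweep pay off: with sorted inputs, most candidate pairs are rejected by the `break` before their full distance is ever computed.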
7.
8.
Coxiella burnetii, the causative agent of Q fever, is an intracellular bacterium and a potential weapon for bioterrorism. Widespread throughout the world, the zoonosis manifests clinically as a self-limited febrile illness, as pneumonia (acute Q fever), or as a chronic illness with endocarditis as its major complication. The recent Netherlands Q fever outbreak has driven the bacterium from a relatively cryptic, underappreciated "niche" microorganism on the sidelines of bacteriology to one of possibly great impact on public health. Advances in the study of this microorganism have proceeded slowly, primarily due to the (until recently) obligate intracellular nature of the pathogen, which in its virulent phase I must be manipulated under biosafety level-3 conditions. Proteomic studies, in particular, have generated a vast amount of information concerning several aspects of the bacterium, such as virulence factors, detection/diagnostic and immunogenic biomarkers, inter-/intraspecies variation, resistance to antibiotics, and secreted effector proteins with significant clinical impact. The phenomenon observed after the genomics era, that of generating and accumulating huge amounts of data that ultimately remain unexploited in various databases, is beginning to emerge in the proteomics field as well. This review will focus on advances in the field of C. burnetii proteomics through mass spectrometry (MS), attempting in parallel to utilize some of the proteomics findings by suggesting future directions for the improvement of Q fever diagnosis and therapy.
9.
Soil fabric anisotropy tensors are related to the statistical distribution of the orientation of various microstructural vector-like entities, the most common being the contact normal vectors between particles, which are extremely difficult to determine for real granular materials. Void-fabric-based tensors, on the other hand, can be determined by image-based quantification methods of voids (graphical approaches), which are well defined and easy to apply to both physical and numerical experiments. A promising void fabric characterization approach is based on the scan line method. Existing scan-line-based definitions of void fabric anisotropy tensors are shown analytically to inherit a shortcoming: numerous small void segments in a sample make an inordinate contribution towards unwarranted isotropy. Discrete Element Method (DEM) analysis subsequently confirms this analytical proof. The fact that such scan line void fabric tensor definitions yield acceptable results when used in conjunction with physical image-based measurements is attributed to the natural "cut off" of smaller void segments that occurs during such measurements. This motivates retaining the existing definition of void fabric tensors while excluding void segments shorter than a "cut off" value associated with an internal length of the granular assembly. In addition, an entirely new void fabric tensor is introduced that uses the squared length, instead of the length, of a void segment as the weighting factor in the definition of the scan line void fabric tensor. DEM analysis shows that both alternative definitions are free of the aforementioned shortcoming and compatible with existing image quantification methods of void fabric anisotropy.
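Both remedies, the cut-off and the squared-length weighting, act on the same weighted-dyadic construction, which a minimal 2-D sketch makes concrete. Each void segment of length l along a scan line with unit direction n contributes w·(n ⊗ n) to the tensor, with w = l in the classical definition and w = l² in the new variant. The segment data, the cut-off value, and the normalization below are illustrative assumptions, not the paper's implementation.

```python
import math

def fabric_tensor(segments, weight="l", cutoff=0.0):
    """Normalized 2x2 scan-line void fabric tensor (as a nested list).
    segments: list of (length, angle_in_radians) void segments.
    weight: "l" for length weighting, "l2" for squared-length weighting.
    cutoff: segments no longer than this are discarded."""
    F = [[0.0, 0.0], [0.0, 0.0]]
    total = 0.0
    for l, theta in segments:
        if l <= cutoff:                    # drop sub-cut-off segments
            continue
        w = l if weight == "l" else l * l  # the two weighting choices
        n = (math.cos(theta), math.sin(theta))
        for i in range(2):
            for j in range(2):
                F[i][j] += w * n[i] * n[j]
        total += w
    return [[F[i][j] / total for j in range(2)] for i in range(2)]

# One long horizontal segment plus many tiny vertical ones (hypothetical data):
segs = [(4.0, 0.0)] + [(0.1, math.pi / 2)] * 10
```

On data like `segs`, length weighting lets the ten tiny segments dilute the horizontal anisotropy, while squared-length weighting or a cut-off restores it, which is the behavior the two proposed definitions are meant to achieve.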
10.
Transactional memory is being advanced as an alternative to traditional lock-based synchronization for concurrent programming. Transactional memory simplifies the programming model and maximizes concurrency. At the same time, transactions can suffer from interference that causes them to often abort, from heavy overheads for memory accesses, and from expressiveness limitations (e.g., for I/O operations). In this paper we propose an adaptive locking technique that dynamically observes whether a critical section would be best executed transactionally or while holding a mutex lock. The critical new elements of our approach include the adaptivity logic and cost–benefit analysis, a low-overhead implementation of statistics collection and adaptive locking in a full C compiler, and an exposition of the effects on the programming model. In experiments with both micro and macrobenchmarks we found adaptive locks to consistently match or outperform the better of the two component mechanisms (mutexes or transactions). Compared to either mechanism alone, adaptive locks often provide 3-to-10x speedups. Additionally, adaptive locks simplify the programming model by reducing the need for fine-grained locking: with adaptive locks, the programmer can specify coarse-grained locking annotations and often achieve fine-grained locking performance due to the transactional memory mechanisms.
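The adaptivity logic can be sketched as a lock wrapper that tracks how often speculative attempts abort and switches to a mutex when aborts dominate. This is a schematic illustration only: Python has no transactional memory, so the speculative path is a caller-supplied stand-in, and the abort threshold and window size are hypothetical tuning constants, not values from the paper.

```python
import threading

class AdaptiveLock:
    ABORT_THRESHOLD = 0.3   # switch to the mutex above this abort ratio (hypothetical)
    WINDOW = 100            # revisit the decision every WINDOW attempts (hypothetical)

    def __init__(self):
        self._mutex = threading.Lock()
        self._attempts = 0
        self._aborts = 0
        self._use_mutex = False

    def run(self, critical_section, try_transaction):
        """critical_section: a plain callable. try_transaction: a caller-supplied
        stand-in that runs it speculatively and returns False on abort."""
        self._attempts += 1
        if not self._use_mutex and try_transaction(critical_section):
            pass                           # committed speculatively
        else:
            if not self._use_mutex:
                self._aborts += 1          # speculative attempt failed
            with self._mutex:              # fall back to pessimistic locking
                critical_section()
        if self._attempts >= self.WINDOW:  # periodic cost-benefit check;
            ratio = self._aborts / self._attempts
            self._use_mutex = ratio > self.ABORT_THRESHOLD
            self._attempts = self._aborts = 0   # mutex mode is re-probed later
```

The wrapper preserves the coarse-grained annotation style the paper describes: the programmer names one critical section, and the mode choice (speculate vs. lock) is made per window from observed statistics.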